Wilson loop


Physics-informed neural network solves minimal surfaces in curved spacetime

Hashimoto, Koji, Kyo, Koichi, Murata, Masaki, Ogiwara, Gakuto, Tanahashi, Norihiro

arXiv.org Artificial Intelligence

We develop a flexible framework based on physics-informed neural networks (PINNs) for solving boundary value problems involving minimal surfaces in curved spacetimes, with a particular emphasis on singularities and moving boundaries. By encoding the underlying physical laws into the loss function and designing network architectures that incorporate the singular behavior and dynamic boundaries, our approach enables robust and accurate solutions to both ordinary and partial differential equations with complex boundary conditions. We demonstrate the versatility of this framework through applications to minimal surface problems in anti-de Sitter (AdS) spacetime, including examples relevant to the AdS/CFT correspondence (e.g. Wilson loops and gluon scattering amplitudes) that are widely used in string theory. Our methods efficiently handle singularities at boundaries and support both "soft" (loss-based) and "hard" (formulation-based) imposition of boundary conditions, including cases where the position of a boundary is promoted to a trainable parameter. The techniques developed here are not limited to high-energy theoretical physics but are broadly applicable to boundary value problems encountered in mathematics, engineering, and the natural sciences, wherever singularities and moving boundaries play a critical role.
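
As a concrete illustration of the "soft" (loss-based) boundary conditions mentioned above, the sketch below trains a small network on a classic flat-space minimal-surface boundary value problem, the catenoid equation y y'' = 1 + y'^2 with exact solution y = cosh(x). The architecture, penalty weight, and optimizer settings are illustrative assumptions, not the authors' setup.

```python
import torch

# Small fully connected network representing the surface profile y(x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1))

def pde_residual(x):
    # Residual of the catenoid (minimal surface) equation y y'' = 1 + y'^2.
    x = x.requires_grad_(True)
    y = net(x)
    dy = torch.autograd.grad(y, x, torch.ones_like(y), create_graph=True)[0]
    d2y = torch.autograd.grad(dy, x, torch.ones_like(dy), create_graph=True)[0]
    return y * d2y - (1.0 + dy ** 2)

x_bc = torch.tensor([[-1.0], [1.0]])
y_bc = torch.cosh(x_bc)                      # exact solution y = cosh(x)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    x_in = 2 * torch.rand(128, 1) - 1        # collocation points in (-1, 1)
    loss = (pde_residual(x_in) ** 2).mean() \
         + 10.0 * ((net(x_bc) - y_bc) ** 2).mean()   # soft boundary penalty
    loss.backward()
    opt.step()
```

A "hard" imposition would instead bake the boundary values into the ansatz, e.g. y(x) = cosh(1) + (1 - x^2) * net(x), so the boundary term disappears from the loss entirely.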


Learning Chern Numbers of Topological Insulators with Gauge Equivariant Neural Networks

Huang, Longde, Balabanov, Oleksandr, Linander, Hampus, Granath, Mats, Persson, Daniel, Gerken, Jan E.

arXiv.org Artificial Intelligence

Equivariant network architectures are a well-established tool for predicting invariant or equivariant quantities. However, almost all learning problems considered in this context feature a global symmetry, i.e. each point of the underlying space is transformed with the same group element, as opposed to a local "gauge" symmetry, where each point is transformed with a different group element, exponentially enlarging the size of the symmetry group. Gauge equivariant networks have so far mainly been applied to problems in quantum chromodynamics. Here, we introduce a novel application domain for gauge-equivariant networks in the theory of topological condensed matter physics. We use gauge equivariant networks to predict topological invariants (Chern numbers) of multiband topological insulators. The gauge symmetry of the network guarantees that the predicted quantity is a topological invariant. We introduce a novel gauge equivariant normalization layer to stabilize the training and prove a universal approximation theorem for our setup. We train on samples with trivial Chern number only but show that our models generalize to samples with non-trivial Chern number. We provide various ablations of our setup. Our code is available at https://github.com/sitronsea/GENet/tree/main.
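
For reference, the gauge-invariant label such networks learn can be computed classically with the Fukui-Hatsugai-Suzuki lattice method, which reads the Chern number off from U(1) link variables and plaquette phases of the Bloch eigenstates. The sketch below does this for the two-band Qi-Wu-Zhang model, chosen here purely as an illustration; it is not the paper's data pipeline, and the sign of the result depends on conventions.

```python
import numpy as np

def chern_number(m, N=40):
    """Fukui-Hatsugai-Suzuki Chern number of the lower band of the
    Qi-Wu-Zhang model (illustrative reference computation)."""
    ks = np.linspace(-np.pi, np.pi, N, endpoint=False)
    u = np.empty((N, N, 2), dtype=complex)   # lower-band eigenvectors
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            dz = m + np.cos(kx) + np.cos(ky)
            h = np.array([[dz, np.sin(kx) - 1j * np.sin(ky)],
                          [np.sin(kx) + 1j * np.sin(ky), -dz]])
            u[i, j] = np.linalg.eigh(h)[1][:, 0]

    def link(a, b):                          # U(1) link variable (phase only)
        z = np.vdot(a, b)
        return z / abs(z)

    F = 0.0
    for i in range(N):
        for j in range(N):
            p = (link(u[i, j], u[(i + 1) % N, j])
                 * link(u[(i + 1) % N, j], u[(i + 1) % N, (j + 1) % N])
                 * link(u[(i + 1) % N, (j + 1) % N], u[i, (j + 1) % N])
                 * link(u[i, (j + 1) % N], u[i, j]))
            F += np.angle(p)                 # field strength on the plaquette
    return round(F / (2 * np.pi))

print(chern_number(m=-1.0))   # topological phase: expect |C| = 1
```

Because each link is normalized to a pure phase, the result is unchanged under any k-dependent rephasing of the eigenvectors, which is exactly the gauge invariance the network architecture enforces.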


Physics-Conditioned Diffusion Models for Lattice Gauge Theory

Zhu, Qianteng, Aarts, Gert, Wang, Wei, Zhou, Kai, Wang, Lingxiao

arXiv.org Artificial Intelligence

We develop diffusion models for simulating lattice gauge theories, where stochastic quantization is explicitly incorporated as a physical condition for sampling. We demonstrate the applicability of this novel sampler to U(1) gauge theory in two spacetime dimensions and find that a model trained at a small inverse coupling constant can be extrapolated to larger inverse coupling regions without encountering the topological freezing problem. Additionally, the trained model can be employed to sample configurations on different lattice sizes without requiring further training. The exactness of the generated samples is ensured by incorporating Metropolis-adjusted Langevin dynamics into the generation process. Furthermore, we demonstrate that this approach enables more efficient sampling of topological quantities compared to traditional algorithms such as Hybrid Monte Carlo and Langevin simulations.
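
The exactness correction the abstract refers to is, in generic form, a Metropolis-adjusted Langevin (MALA) accept/reject step. The sketch below shows such a step for an arbitrary log-density; the function names and interface are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def mala_step(phi, logp, grad_logp, eps, rng):
    """One Metropolis-adjusted Langevin step for log-density logp."""
    prop = phi + eps * grad_logp(phi) \
         + np.sqrt(2 * eps) * rng.standard_normal(phi.shape)

    def log_q(a, b):
        # Log-density (up to a constant) of proposing a from b.
        return -np.sum((a - b - eps * grad_logp(b)) ** 2) / (4 * eps)

    log_alpha = logp(prop) - logp(phi) + log_q(phi, prop) - log_q(prop, phi)
    if np.log(rng.uniform()) < log_alpha:
        return prop, True                    # accepted
    return phi, False                        # rejected, keep old field

# Usage with a toy Gaussian action S(phi) = phi^2 / 2, i.e. logp = -S.
rng = np.random.default_rng(0)
phi = np.zeros(16)
for _ in range(1000):
    phi, _ = mala_step(phi, lambda p: -0.5 * np.sum(p ** 2),
                       lambda p: -p, eps=0.1, rng=rng)
```

Because the Langevin proposal is asymmetric, the acceptance ratio must include both forward and reverse proposal densities; this is what makes the sampled distribution exact regardless of discretization error in the drift.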


Deep learning lattice gauge theories

Apte, Anuj, Ashmore, Anthony, Cordova, Clay, Huang, Tzu-Chen

arXiv.org Artificial Intelligence

Monte Carlo methods have led to profound insights into the strong-coupling behaviour of lattice gauge theories and produced remarkable results such as first-principles computations of hadron masses. Despite tremendous progress over the last four decades, fundamental challenges such as the sign problem and the inability to simulate real-time dynamics remain. Neural network quantum states have emerged as an alternative method that seeks to overcome these challenges. In this work, we use gauge-invariant neural network quantum states to accurately compute the ground state of $\mathbb{Z}_N$ lattice gauge theories in $2+1$ dimensions. Using transfer learning, we study the distinct topological phases and the confinement phase transition of these theories. For $\mathbb{Z}_2$, we identify a continuous transition and compute critical exponents, finding excellent agreement with existing numerics for the expected Ising universality class. In the $\mathbb{Z}_3$ case, we observe a weakly first-order transition and identify the critical coupling. Our findings suggest that neural network quantum states are a promising method for precise studies of lattice gauge theory.
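
To make the gauge invariance of such ansätze concrete: for Z_2 links on a square lattice, any amplitude that depends on the links only through plaquette products is exactly gauge invariant. The Jastrow-style toy wavefunction below, including the lattice size and weight structure, is an assumed illustration rather than the authors' ansatz.

```python
import numpy as np

def plaquettes(ux, uy):
    """Gauge-invariant Z_2 plaquettes; ux[x, y], uy[x, y] = +/-1 are the
    links leaving site (x, y) in the x- and y-directions (periodic)."""
    return ux * np.roll(uy, -1, axis=0) * np.roll(ux, -1, axis=1) * uy

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))              # one weight per plaquette

def log_psi(ux, uy):
    # Depends on links only via plaquettes, hence exactly gauge invariant.
    return np.sum(W * plaquettes(ux, uy))

# Check invariance under a random Z_2 gauge transformation s(x) = +/-1.
ux = rng.choice([-1, 1], size=(8, 8))
uy = rng.choice([-1, 1], size=(8, 8))
s = rng.choice([-1, 1], size=(8, 8))
ux_g = s * ux * np.roll(s, -1, axis=0)       # links pick up s at both ends
uy_g = s * uy * np.roll(s, -1, axis=1)
assert np.isclose(log_psi(ux, uy), log_psi(ux_g, uy_g))
```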


Preserving gauge invariance in neural networks

Favoni, Matteo, Ipp, Andreas, Müller, David I., Schuh, Daniel

arXiv.org Machine Learning

In these proceedings we present lattice gauge equivariant convolutional neural networks (L-CNNs) which are able to process data from lattice gauge theory simulations while exactly preserving gauge symmetry. We review aspects of the architecture and show how L-CNNs can represent a large class of gauge invariant and equivariant functions on the lattice. We compare the performance of L-CNNs and non-equivariant networks using a non-linear regression problem and demonstrate how gauge invariance is broken for non-equivariant models.


Lattice gauge equivariant convolutional neural networks

Favoni, Matteo, Ipp, Andreas, Müller, David I., Schuh, Daniel

arXiv.org Machine Learning

We propose Lattice gauge equivariant Convolutional Neural Networks (L-CNNs) for generic machine learning applications on lattice gauge theoretical problems. At the heart of this network structure is a novel convolutional layer that preserves gauge equivariance while forming arbitrarily shaped Wilson loops in successive bilinear layers. We demonstrate that L-CNNs can learn and generalize gauge invariant quantities that traditional convolutional neural networks are incapable of finding.
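
The elementary gauge-invariant quantity from which L-CNN layers are built, the traced 1x1 Wilson loop (plaquette), can be written down directly. The sketch below computes it from generic unitary link matrices; the array layout is an assumption for illustration, not the paper's code.

```python
import numpy as np

def plaquette_trace(U):
    """Traced 1x1 Wilson loops from unitary links U[mu, x, y] (N x N
    matrices); for unitary links, the inverse is the conjugate transpose."""
    Ux, Uy = U[0], U[1]
    Lx, Ly = Ux.shape[:2]
    tr = np.empty((Lx, Ly), dtype=complex)
    for x in range(Lx):
        for y in range(Ly):
            P = (Ux[x, y]
                 @ Uy[(x + 1) % Lx, y]
                 @ Ux[x, (y + 1) % Ly].conj().T
                 @ Uy[x, y].conj().T)
            tr[x, y] = np.trace(P)           # gauge-invariant trace
    return tr

# Example: identity links give tr P = N on every plaquette.
N, L = 2, 4
U = np.tile(np.eye(N, dtype=complex), (2, L, L, 1, 1))
print(plaquette_trace(U)[0, 0])              # (2+0j)
```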

arXiv:2012.12901